Bagging and Boosting performance in Projection Pursuit Regression
Authors
Abstract
Recently, many authors have proposed new algorithms to improve the accuracy of certain classifiers on artificial and real data sets. The goal is to assemble a collection of individual classifiers based on resampling of the data set. Bagging (Breiman, 1996) and AdaBoost (Freund & Schapire, 1997) are the most widely used procedures: the first fits many classifiers to bootstrap samples of the data and classifies units by majority vote; the second fits many classifiers to reweighted versions of the data and classifies units by weighted majority vote. The success of these methods is explained in terms of the bias-variance decomposition of the generalization error. In the regression context, the application of these techniques has received little investigation. Drucker (1997) and other authors modified AdaBoost to use regression trees as predictors. Our aim is to verify by simulation whether boosting or bagging can jointly reduce training-set error and generalization error when Projection Pursuit Regressions are used as predictors.
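The bagging procedure described above can be sketched in a few lines. This is a minimal illustration only, not the paper's setup: it uses one-dimensional decision stumps as base classifiers instead of the Projection Pursuit Regressions studied here, and all names (`fit_stump`, `bagging`) are hypothetical.

```python
import numpy as np

def fit_stump(X, y):
    # Trivial base learner: the threshold on feature 0 (and sign)
    # that minimizes training error (a "decision stump").
    best = (0.0, 1, np.inf)
    for t in np.unique(X[:, 0]):
        for sign in (1, -1):
            pred = np.where(sign * (X[:, 0] - t) > 0, 1, -1)
            err = np.mean(pred != y)
            if err < best[2]:
                best = (t, sign, err)
    t, sign, _ = best
    return lambda Z: np.where(sign * (Z[:, 0] - t) > 0, 1, -1)

def bagging(X, y, n_estimators=25, seed=0):
    # Fit one base classifier per bootstrap sample of the data.
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)      # bootstrap resample
        models.append(fit_stump(X[idx], y[idx]))
    def predict(Z):
        votes = np.sum([m(Z) for m in models], axis=0)
        return np.where(votes >= 0, 1, -1)    # majority vote
    return predict

# Toy 1-D data: negative class below 2.5, positive above.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
predict = bagging(X, y)
```

AdaBoost differs in that each new classifier is fit to data reweighted toward previously misclassified units, and the final vote is weighted by each classifier's training accuracy rather than counted equally.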
Similar Articles
Improving Regressors using Boosting Techniques
In the regression context, boosting and bagging are techniques to build a committee of regressors that may be superior to a single regressor. We use regression trees as fundamental building blocks in bagging committee machines and boosting committee machines. Performance is analyzed on three non-linear functions and the Boston housing database. In all cases, boosting is at least equivalent, and...
Improving reservoir rock classification in heterogeneous carbonates using boosting and bagging strategies: A case study of early Triassic carbonates of coastal Fars, south Iran
An accurate reservoir characterization is a crucial task for the development of quantitative geological models and reservoir simulation. In the present research work, a novel view is presented on the reservoir characterization using the advantages of thin section image analysis and intelligent classification algorithms. The proposed methodology comprises three main steps. First, four classes of...
Boosting and Bagging of Neural Networks with Applications to Financial Time Series
Boosting and bagging are two techniques for improving the performance of learning algorithms. Both techniques have been successfully used in machine learning to improve the performance of classification algorithms such as decision trees and neural networks. In this paper, we focus on the use of feedforward back-propagation neural networks for time series classification problems. We apply boosting ...
Combining Bagging and Additive Regression
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as base learner. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in t...
L1 regularized projection pursuit for additive model learning
In this paper, we present an L1-regularized projection pursuit algorithm for additive model learning. Two new algorithms are developed, for regression and classification respectively: sparse projection pursuit regression and sparse Jensen-Shannon Boosting. The introduced L1-regularized projection pursuit encourages sparse solutions, thus our new algorithms are robust to overfitting and present be...